Fast Second Order Learning

Authors

  • Stanislaw OSOWSKI
  • Piotr BOJARCZAK
  • Maciej STODOLSKI
Abstract

The paper presents an efficient training program for multilayer feedforward neural networks. It is based on the best second-order optimization algorithms, including variable metric and conjugate gradient, as well as the application of directional minimization at each step. Its efficiency is proved on standard tests, including the parity, dichotomy, logistic, and 2-spiral problems. Applications of the algorithm to the solution of higher-dimensionality problems such as deconvolution, separation of sources, and identification of a nonlinear dynamic plant are also given and discussed. It is shown that an appropriately trained neural network can be used for the nonconventional solution of these standard signal processing tasks with satisfactory accuracy. The results of numerical experiments are included and discussed in the paper.
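The abstract's core recipe, conjugate-gradient search directions combined with a directional (line) minimization at each step, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the single-neuron "network", the toy AND dataset, the finite-difference gradient, and the grid-based line search are all assumptions made to keep the sketch self-contained.

```python
import math

# Toy dataset: learn logical AND with a single sigmoid neuron
# (an illustrative stand-in for the multilayer networks in the paper).
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w):
    # Sum-of-squares error over the dataset; w = [w1, w2, bias].
    return sum((sigmoid(w[0] * x1 + w[1] * x2 + w[2]) - t) ** 2
               for (x1, x2), t in DATA)

def grad(w, h=1e-6):
    # Central-difference gradient (keeps the sketch dependency-free;
    # a real trainer would use backpropagation).
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += h
        wm[i] -= h
        g.append((loss(wp) - loss(wm)) / (2 * h))
    return g

def line_minimize(w, d, alphas=tuple(2.0 ** k for k in range(4, -16, -1))):
    # Crude directional minimization: pick the best step from a grid.
    best_a, best_f = 0.0, loss(w)
    for a in alphas:
        f = loss([wi + a * di for wi, di in zip(w, d)])
        if f < best_f:
            best_a, best_f = a, f
    return best_a

def train_cg(w, iters=200):
    # Polak-Ribiere conjugate gradient with restart (beta clipped at 0).
    g = grad(w)
    d = [-gi for gi in g]
    for _ in range(iters):
        a = line_minimize(w, d)
        w = [wi + a * di for wi, di in zip(w, d)]
        g_new = grad(w)
        beta = max(0.0,
                   sum(gn * (gn - go) for gn, go in zip(g_new, g))
                   / (sum(go * go for go in g) + 1e-12))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return w

w = train_cg([0.1, -0.1, 0.0])
print(loss(w))  # far below the initial loss of roughly 1.0
```

The restart rule (clipping beta at zero) falls back to steepest descent whenever the conjugate direction stops being a descent direction, which is a common safeguard in practical conjugate-gradient trainers.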


Similar references

An Improved Particle Swarm Optimizer Based on a Novel Class of Fast and Efficient Learning Factors Strategies

The particle swarm optimizer (PSO) is a population-based metaheuristic optimization method that can be applied to a wide range of problems, but it has drawbacks: it easily falls into local optima and suffers from slow convergence in the later stages. In order to solve these problems, improved PSO (IPSO) variants have been proposed. To bring about a balance between the exploration and ex...
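The baseline PSO that the reference above improves upon can be sketched with the standard velocity update, v ← w·v + c1·r1·(pbest − x) + c2·r2·(gbest − x). This is a minimal illustrative example, not the cited paper's method: the sphere benchmark objective and all hyperparameter values are assumptions.

```python
import random

def sphere(x):
    # Benchmark objective: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def pso(obj, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    # Random initial positions, zero initial velocities.
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]          # each particle's best position
    pbest_f = [obj(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            f = obj(xs[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = list(xs[i]), f
                if f < gbest_f:
                    gbest, gbest_f = list(xs[i]), f
    return gbest, gbest_f

best, best_f = pso(sphere)
print(best_f)
```

The learning factors c1 and c2 are exactly the knobs the cited IPSO work adapts over time to balance exploration (large social/cognitive pulls early) against exploitation (tighter convergence later).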


Working Set Selection Using Second Order Information for Training Support Vector Machines

Working set selection is an important step in decomposition methods for training support vector machines (SVMs). This paper develops a new technique for working set selection in SMO-type decomposition methods. It uses second order information to achieve fast convergence. Theoretical properties such as linear convergence are established. Experiments demonstrate that the proposed method is faster...


Fast NP Chunking Using Memory-Based Learning Techniques

In this paper we discuss the application of Memory-Based Learning (MBL) to fast NP chunking. We first discuss the application of a fast decision tree variant of MBL (IGTree) on the dataset described in (Ramshaw and Marcus, 1995), which consists of roughly 50,000 test and 200,000 train items. In a second series of experiments we used an architecture of two cascaded IGTrees. In the second level o...


Online Streaming Feature Selection Using Geometric Series of the Adjacency Matrix of Features

Feature Selection (FS) is an important pre-processing step in machine learning and data mining. All the traditional feature selection methods assume that the entire feature space is available from the beginning. However, online streaming features (OSF) are an integral part of many real-world applications. In OSF, the number of training examples is fixed while the number of features grows with t...


Input Fast-Forwarding for Better Deep Learning

This paper introduces a new architectural framework, known as input fast-forwarding, that can enhance the performance of deep networks. The main idea is to incorporate a parallel path that sends representations of input values forward to deeper network layers. This scheme is substantially different from “deep supervision,” in which the loss layer is re-introduced to earlier layers. The parallel...


Damage identification of structures using second-order approximation of Neumann series expansion

In this paper, a novel approach is proposed for structural damage detection from a limited number of sensors using an extreme learning machine (ELM). As the number of sensors used to measure modal data is normally limited and usually less than the number of DOFs in the finite element model, a model reduction approach should be used to match the incomplete measured mode shapes. The second-order a...



Journal title:

Volume   Issue

Pages  -

Publication date: 1996